Probabilistic Rank-One Matrix Analysis with Concurrent Regularization
Authors
Abstract
As a classical subspace learning method, Probabilistic PCA (PPCA) has been extended to several bilinear variants for dealing with matrix observations. However, these variants are all based on the Tucker model, which leads to a restricted subspace representation and to rotational ambiguity. To address these problems, this paper proposes a bilinear PPCA method named Probabilistic Rank-One Matrix Analysis (PROMA). PROMA is based on the CP model, which yields a more flexible subspace representation and does not suffer from rotational ambiguity. For better generalization, concurrent regularization is introduced to regularize the whole matrix subspace, rather than the column and row factors separately. Experiments on both synthetic and real-world data demonstrate the superiority of PROMA in subspace estimation and classification, as well as the effectiveness of concurrent regularization in regularizing bilinear PPCAs.
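To make the Tucker-versus-CP contrast concrete, the generic forms below may help; the notation is illustrative and is not taken from the paper itself. A Tucker-style bilinear model couples separate column and row factor matrices through a latent core, whereas a CP model writes each matrix observation as a weighted sum of rank-one terms, i.e. outer products of column and row factors.

```latex
% Illustrative only: generic Tucker-style vs. CP-style factorizations of a
% matrix observation X in R^{m x n}; the symbols are not the paper's notation.
\begin{aligned}
  \text{Tucker-style:}\quad
    \mathbf{X} &\approx \mathbf{U}\,\mathbf{Z}\,\mathbf{V}^{\top},
    && \mathbf{U}\in\mathbb{R}^{m\times p},\;
       \mathbf{Z}\in\mathbb{R}^{p\times q},\;
       \mathbf{V}\in\mathbb{R}^{n\times q},\\
  \text{CP (rank-one):}\quad
    \mathbf{X} &\approx \sum_{r=1}^{R} z_r\,\mathbf{u}_r \mathbf{v}_r^{\top},
    && \mathbf{u}_r\in\mathbb{R}^{m},\; \mathbf{v}_r\in\mathbb{R}^{n}.
\end{aligned}
```

In the Tucker form, rotations of the column and row factors can be compensated by the latent core, which is the source of the rotational ambiguity mentioned above; the rank-one terms of the CP form admit no such compensating core.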
Similar articles
Probabilistic Low-Rank Matrix Completion with Adaptive Spectral Regularization Algorithms
We propose a novel class of algorithms for low-rank matrix completion. Our approach builds on novel penalty functions on the singular values of the low-rank matrix. By exploiting a mixture-model representation of this penalty, we show that a suitably chosen set of latent variables enables us to derive an Expectation-Maximization algorithm to obtain a Maximum A Posteriori estimate of the completed l...
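For orientation, the sketch below implements a plain fixed-threshold variant of singular-value shrinkage for completion (soft-impute style); it is not the adaptive spectral regularization or the EM algorithm of the paper above, and the function name and parameters are made up for illustration.

```python
import numpy as np

def soft_impute(X, mask, lam, n_iters=100):
    """Minimal matrix-completion sketch: iteratively fill the missing entries
    and soft-threshold the singular values (a fixed nuclear-norm-style penalty).
    Generic baseline only, not the paper's adaptive spectral penalty.
    `mask` is True where X is observed."""
    Z = np.where(mask, X, 0.0)            # start with zeros in the holes
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s = np.maximum(s - lam, 0.0)      # shrink singular values toward zero
        low_rank = (U * s) @ Vt
        Z = np.where(mask, X, low_rank)   # keep observed entries, impute the rest
    return low_rank

# Tiny usage example: a random rank-2 matrix with roughly 40% missing entries
rng = np.random.default_rng(0)
M = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 20))
mask = rng.random(M.shape) > 0.4
M_hat = soft_impute(M, mask, lam=0.5)
```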
Theoretical Analysis of Bayesian Matrix Factorization
Recently, variational Bayesian (VB) techniques have been applied to probabilistic matrix factorization and shown to perform very well in experiments. In this paper, we theoretically elucidate properties of the VB matrix factorization (VBMF) method. Through finite-sample analysis of the VBMF estimator, we show that two types of shrinkage factors exist in the VBMF estimator: the positive-part Jam...
Rank-One Matrix Completion with Automatic Rank Estimation via L1-Norm Regularization
Completing a matrix from a small subset of its entries, i.e., matrix completion, is a challenging problem arising in many real-world applications, such as machine learning and computer vision. One popular approach to solving the matrix completion problem is based on low-rank decomposition/factorization. Low-rank matrix decomposition-based methods often require a pre-specified rank, which is d...
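The baseline below illustrates the "pre-specified rank" limitation that this paper targets: a generic alternating-least-squares completion in which the rank must be chosen in advance. It is only a sketch under that assumption, not the L1-regularized automatic-rank method of the paper; the names and parameters are illustrative.

```python
import numpy as np

def als_complete(X, mask, rank, lam=0.1, n_iters=50):
    """Generic alternating-least-squares completion with a *pre-specified*
    rank.  Fits X ~ U @ V.T on the observed entries; lam is a small ridge
    term for numerical stability.  Illustrative baseline only."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    U = rng.normal(scale=0.1, size=(m, rank))
    V = rng.normal(scale=0.1, size=(n, rank))
    I = lam * np.eye(rank)
    for _ in range(n_iters):
        for i in range(m):                      # update each row of U
            obs = mask[i]
            Vo = V[obs]
            U[i] = np.linalg.solve(Vo.T @ Vo + I, Vo.T @ X[i, obs])
        for j in range(n):                      # update each row of V
            obs = mask[:, j]
            Uo = U[obs]
            V[j] = np.linalg.solve(Uo.T @ Uo + I, Uo.T @ X[obs, j])
    return U @ V.T
```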
Tutorial on Probabilistic Topic Modeling: Additive Regularization for Stochastic Matrix Factorization
Probabilistic topic modeling of text collections is a powerful tool for statistical text analysis. In this tutorial we introduce a novel non-Bayesian approach, called Additive Regularization of Topic Models (ARTM). ARTM is free of redundant probabilistic assumptions and provides simple inference for many combined and multi-objective topic models.
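The core idea, as commonly stated for additive regularization of topic models, is to maximize the collection log-likelihood plus a weighted sum of regularizers; the sketch below uses standard topic-modeling notation (word-topic matrix Φ, topic-document matrix Θ), which may differ from the tutorial's own symbols.

```latex
% Hedged sketch of the additive-regularization criterion; n_{dw} is the count
% of word w in document d, and each R_i is a user-chosen regularizer with
% non-negative weight tau_i.
\max_{\Phi,\Theta}\;
  \sum_{d}\sum_{w} n_{dw}\,
    \ln\!\Big(\sum_{t} \phi_{wt}\,\theta_{td}\Big)
  \;+\; \sum_{i} \tau_i\, R_i(\Phi,\Theta)
```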
Survey on Probabilistic Models of Low-Rank Matrix Factorizations
Low-rank matrix factorizations such as Principal Component Analysis (PCA), Singular Value Decomposition (SVD) and Non-negative Matrix Factorization (NMF) form a large class of methods for computing a low-rank approximation of a given data matrix. Conventional factorization models are based on the assumption that the data matrices are contaminated stochastically by some type of noise. Thus t...
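The common starting point described above can be written as a noisy low-rank factorization; the symbols below are generic, and the probabilistic variants surveyed in the paper differ mainly in the distributional assumptions placed on the noise term and the factors.

```latex
% Generic noisy low-rank model underlying PCA/SVD/NMF-style factorizations;
% E is the stochastic noise term, and NMF additionally requires W, H >= 0.
\mathbf{X} = \mathbf{W}\mathbf{H} + \mathbf{E},
\qquad \mathbf{W}\in\mathbb{R}^{m\times k},\;
       \mathbf{H}\in\mathbb{R}^{k\times n},\;
       k \ll \min(m, n)
```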